Capacity, mutual information, and coding for finite-state Markov channels
Authors
Abstract
The Finite-State Markov Channel (FSMC) is a discrete time-varying channel whose variation is determined by a finite-state Markov process. These channels have memory due to the Markov channel variation. We obtain the FSMC capacity as a function of the conditional channel state probability. We also show that for i.i.d. channel inputs, this conditional probability converges weakly, and the channel's mutual information is then a closed-form continuous function of the input distribution. We next consider coding for FSMCs. In general, the complexity of maximum-likelihood decoding grows exponentially with the channel memory length. Therefore, in practice, interleaving and memoryless channel codes are used. This technique results in some performance loss relative to the inherent capacity of channels with memory. We propose a maximum-likelihood decision-feedback decoder with complexity that is independent of the channel memory. We calculate the capacity and cutoff rate of our technique, and show that it preserves the capacity of certain FSMCs. We also compare the performance of the decision-feedback decoder with that of interleaving and memoryless channel coding on a fading channel with 4PSK modulation.
Index Terms: Finite-state Markov channels, capacity, mutual information, decision-feedback maximum-likelihood decoding.
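To make the channel model concrete, the following is a minimal sketch of a two-state FSMC (a Gilbert–Elliott-style channel, where each state is a binary symmetric channel with its own crossover probability) together with the recursive update of the conditional channel state probability that the abstract refers to. All numeric parameter values and function names here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Illustrative two-state FSMC: each state is a BSC with its own crossover
# probability. The transition matrix and crossover values are made-up examples.
P = np.array([[0.95, 0.05],   # Markov transition matrix over channel states
              [0.10, 0.90]])
eps = np.array([0.01, 0.30])  # BSC crossover probability in each state

rng = np.random.default_rng(0)

def simulate(n):
    """Simulate n channel uses with i.i.d. uniform binary inputs."""
    s = 0
    xs, ys = [], []
    for _ in range(n):
        x = int(rng.integers(0, 2))
        y = x ^ int(rng.random() < eps[s])  # flip with state-dependent prob.
        xs.append(x)
        ys.append(y)
        s = rng.choice(2, p=P[s])           # Markov state transition
    return xs, ys

def state_posterior(xs, ys, pi0=np.array([0.5, 0.5])):
    """Track the conditional channel state probability recursively.

    This is a sketch of the sufficient-statistic idea: after each (input,
    output) pair, condition the state distribution on the observation, then
    propagate it through the state transition matrix.
    """
    pi = pi0
    for x, y in zip(xs, ys):
        like = np.where(x == y, 1 - eps, eps)  # P(y | x, state) per state
        pi = pi * like
        pi = pi / pi.sum()                     # condition on (x, y)
        pi = pi @ P                            # predict the next state
    return pi

xs, ys = simulate(200)
print(state_posterior(xs, ys))  # a probability vector over the two states
```

The recursion keeps only a fixed-size probability vector rather than the full channel history, which is the same property that makes the conditional state probability a useful quantity in the capacity analysis.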
Similar References
On Entropy and Lyapunov Exponents for Finite-State Channels
The Finite-State Markov Channel (FSMC) is a time-varying channel having states that are characterized by a finite-state Markov chain. These channels have infinite memory, which complicates their capacity analysis. We develop a new method to characterize the capacity of these channels based on Lyapunov exponents. Specifically, we show that the input, output, and conditional entropies for this ch...
Entropy and Mutual Information for Markov Channels with General Inputs
We study new formulas based on Lyapunov exponents for entropy, mutual information, and capacity of finite state discrete time Markov channels. We also develop a method for directly computing mutual information and entropy using continuous state space Markov chains. Our methods allow for arbitrary input processes and channel dynamics, provided both have finite memory. We show that the entropy ra...
Capacity of Finite State Markov Channels with General Inputs
We study new formulae based on Lyapunov exponents for entropy, mutual information, and capacity of finite state discrete time Markov channels. We also develop a method for directly computing mutual information and entropy using continuous state space Markov chains. Our methods allow for arbitrary input processes and channel dynamics, provided both have finite memory. We show that the entropy ra...
A Generalized Blahut-Arimoto Algorithm
Kavčić proposed in [1] an algorithm that optimizes the parameters of a Markov source at the input to a finite-state machine channel in order to maximize the mutual information rate. Numerical results for several channels indicated that his algorithm gives capacity-achieving input distributions. In this paper we prove that the stationary points of this algorithm indeed correspond one-to-one to t...
Journal: IEEE Trans. Information Theory
Volume: 42, Issue: -
Pages: -
Publication date: 1996